51 research outputs found

    Metamodel variability analysis combining bootstrapping and validation techniques

    Research on metamodel-based optimization has received increasing interest in recent years and has found successful applications in solving computationally expensive problems. The joint use of computer simulation experiments and metamodels introduces a source of uncertainty that we refer to as metamodel variability. To analyze and quantify this variability, we apply bootstrapping to residuals derived as prediction errors computed from cross-validation. The proposed method can be used with different types of metamodels, especially when limited knowledge of the parameters' distribution is available or when only a limited computational budget is allowed. Our preliminary experiments, based on the robust version of the EOQ model, show encouraging results.
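    The bootstrap-of-residuals idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the quadratic regression metamodel, the toy EOQ-like cost response, and all numeric values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expensive simulation: a noisy EOQ-like total-cost response.
def simulate(q):
    return 100.0 / q + 0.5 * q + rng.normal(scale=0.3)

X = np.linspace(5.0, 30.0, 12)          # design points (order quantities)
y = np.array([simulate(q) for q in X])  # simulated responses

def fit(X, y):
    """Quadratic regression metamodel (a stand-in for Kriging)."""
    return np.polyfit(X, y, deg=2)

# Leave-one-out cross-validation: each residual is the prediction error
# at a design point left out of the fit.
residuals = []
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    coef = fit(X[mask], y[mask])
    residuals.append(y[i] - np.polyval(coef, X[i]))
residuals = np.array(residuals)

# Bootstrap the cross-validation residuals to quantify metamodel
# variability at a new point.
coef_full = fit(X, y)
x_new, B = 15.0, 1000
boot_preds = (np.polyval(coef_full, x_new)
              + rng.choice(residuals, size=B, replace=True))
lo, hi = np.percentile(boot_preds, [2.5, 97.5])
print(f"95% bootstrap interval at q={x_new}: [{lo:.2f}, {hi:.2f}]")
```

    The width of the resulting percentile interval is one way to express the metamodel variability that the abstract refers to.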

    Dynamic Objectives Aggregation in Multi-objective Evolutionary Optimization

    Several approaches for solving multi-objective optimization problems entail a form of scalarization of the objectives. This paper proposes a study of different dynamic objectives aggregation methods in the context of evolutionary algorithms. These methods are mainly based on both weighted sum aggregations and curvature variations. A comparison analysis is presented on the basis of a campaign of computational experiments on a set of benchmark problems from the literature.

    Keywords: Multi-objective optimization, Evolutionary algorithms, Aggregate objective functions
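    A dynamic weighted-sum aggregation can be sketched as below. This is a generic illustration, not the paper's method: the two toy objectives, the linear weight schedule, and the (mu+lambda)-style loop are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical objectives to minimize (a toy bi-objective problem).
def f1(x):
    return np.sum(x**2)

def f2(x):
    return np.sum((x - 2.0)**2)

def aggregate(x, t, T):
    # Dynamic weighted sum: the weight sweeps from 0 to 1 across
    # generations, shifting emphasis between objectives over time.
    w = t / T
    return w * f1(x) + (1.0 - w) * f2(x)

# Minimal evolutionary loop with mutation and truncation selection.
T, pop = 50, rng.normal(size=(20, 3))
for t in range(T):
    offspring = pop + rng.normal(scale=0.1, size=pop.shape)  # mutation
    union = np.vstack([pop, offspring])
    scores = np.array([aggregate(x, t, T) for x in union])
    pop = union[np.argsort(scores)[:20]]                     # keep best 20

best = pop[0]
```

    Because the weights change during the run, the population is pulled along different trade-offs between the objectives rather than converging to a single fixed scalarization.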

    DOAM for Evolutionary Portfolio Optimization: a computational study.

    In this work, the ability of the Dynamic Objectives Aggregation Methods to solve the portfolio rebalancing problem is investigated by conducting a computational study on a set of instances based on real data. The portfolio model considers a set of realistic constraints and entails the simultaneous optimization of the portfolio risk, the expected return, and the transaction costs.

    An efficient decomposition approach for surgical planning

    This talk presents an efficient decomposition approach to surgical planning. Given a set of surgical waiting lists (one for each discipline) and an operating theater, the problem is to decide the room-to-discipline assignment for the next planning period (Master Surgical Schedule), and the surgical cases to be performed (Surgical Case Assignment), with the objective of optimizing a score related to the priority and current waiting time of the cases. While in general the MSS and SCA may be found concurrently by solving a complex integer programming problem, we propose an effective decomposition algorithm which does not require expensive or sophisticated computational resources, and is therefore suitable for implementation in any real-life setting. Our decomposition approach consists of first producing a number of subsets of surgical cases for each discipline (potential OR sessions) and then selecting a subset of them. The surgical cases in the selected potential sessions are then discarded, and only the structure of the MSS is retained. A detailed surgical case assignment is then devised by filling the resulting MSS with cases from the waiting lists, via an exact optimization model. The quality of the plan obtained is assessed by comparing it with the plan obtained by solving the exact integrated formulation for MSS and SCA. Nine different scenarios are considered, for various operating theater sizes and management policies. The results on instances concerning a medium-size hospital show that the decomposition method produces solutions comparable to those of the exact method in much smaller computation time.
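    The two-phase structure of the decomposition can be sketched as follows. This is a simplified illustration under invented data, not the authors' algorithm: the case tuples, session parameters, and the greedy session builder (standing in for both the session-generation step and the exact assignment model) are all hypothetical.

```python
# Hypothetical data: (discipline, duration_hours, priority_score) per case.
waiting = [
    ("ortho", 2, 9), ("ortho", 3, 7), ("ortho", 1, 5),
    ("cardio", 4, 8), ("cardio", 2, 6), ("cardio", 2, 4),
]
SESSION_HOURS, N_SESSIONS = 6, 2

def candidate_session(cases):
    """Greedily pack one potential OR session (by score density)."""
    chosen, used = [], 0
    for c in sorted(cases, key=lambda c: c[2] / c[1], reverse=True):
        if used + c[1] <= SESSION_HOURS:
            chosen.append(c)
            used += c[1]
    return chosen, sum(c[2] for c in chosen)

# Phase 1: build potential sessions per discipline and keep only the
# MSS structure, i.e. which discipline each session slot is assigned to.
by_disc = {}
for c in waiting:
    by_disc.setdefault(c[0], []).append(c)
sessions = sorted(
    ((candidate_session(cs)[1], d) for d, cs in by_disc.items()),
    reverse=True)
mss = [d for _, d in sessions[:N_SESSIONS]]  # room-to-discipline assignment

# Phase 2: discard the tentative case selections and refill each MSS slot
# from the full waiting list (greedy here; the talk uses an exact model).
plan = {d: candidate_session(by_disc[d])[0] for d in mss}
```

    The point of the decomposition is that each phase is a much smaller problem than the integrated MSS-plus-SCA formulation.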

    Robust optimization in simulation: Taguchi and response surface methodology

    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a `robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by Response Surface Methodology (RSM). George Box originated RSM, and Douglas Montgomery recently extended RSM to robust optimization of real (non-simulated) systems. We combine Taguchi's view with RSM for simulated systems. We illustrate the resulting methodology through classic Economic Order Quantity (EOQ) inventory models, which demonstrate that robust optimization may require order quantities that differ from the classic EOQ.
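    The contrast between the classic EOQ and a robust order quantity can be sketched as below. This is a toy Taguchi-style illustration, not the article's RSM methodology: the cost parameters, the demand distribution, and the mean-plus-standard-deviation robustness criterion are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

K, h = 50.0, 1.0               # setup cost, holding cost per unit (assumed)
D_mean, D_sd = 1000.0, 200.0   # demand rate is uncertain (the noise factor)

def cost(q, D):
    """Total annual cost of the classic EOQ model."""
    return K * D / q + h * q / 2.0

# Classic EOQ assumes the demand rate D is known exactly.
q_classic = np.sqrt(2 * D_mean * K / h)

# Robust choice: evaluate each candidate q over sampled demand scenarios
# and pick the one minimizing mean cost plus a penalty on cost spread.
D_samples = rng.normal(D_mean, D_sd, size=2000)
candidates = np.linspace(0.5 * q_classic, 1.5 * q_classic, 101)
scores = [np.mean(cost(q, D_samples)) + np.std(cost(q, D_samples))
          for q in candidates]
q_robust = candidates[int(np.argmin(scores))]
```

    Penalizing the spread of the cost pushes the chosen order quantity away from the classic EOQ, which is the qualitative effect the abstract describes.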

    Metamodel variability in robust simulation-optimization: a bootstrap analysis

    Metamodels are often used in simulation-optimization for the design and management of complex systems. These metamodels yield insight into the relationship between responses and decision variables, providing fast analysis tools instead of the more expensive computer simulations. Moreover, these metamodels enable the integration of discipline-dependent analysis into the overall decision process. The use of stochastic simulation experiments and metamodels introduces a source of uncertainty in the decision process that we refer to as metamodel variability. To quantify this variability, we use bootstrapping. More specifically, we combine cross-validation and bootstrapping to simulate the metamodel construction process in stochastic environments. The resulting methodology is illustrated through the well-known Economic Order Quantity (EOQ) model using Kriging and regression metamodels. The relative validation errors are small, so they suggest that the metamodels give an adequate approximation, and bootstrapping these errors allows us to quantify the metamodels' variability in an acceptable way.
